12 research outputs found

    Secure and Efficient Models for Retrieving Data from Encrypted Databases in Cloud

    Get PDF
    Recently, database users have begun to use cloud database services to outsource their databases, drawn by the high computation speed and huge storage capacity that cloud providers offer at low prices. However, despite the attractiveness of the cloud computing environment, privacy remains a concern for database owners, since data access is out of their control. Encryption is the only way of assuaging users’ fears surrounding data privacy, but executing Structured Query Language (SQL) queries over encrypted data is a challenging task, especially if the data are encrypted by a randomized encryption algorithm. Many researchers have addressed the privacy issues by encrypting the data using deterministic, onion-layer, or homomorphic encryption. Nevertheless, even with these systems, the encrypted data can still be subjected to attack. In this research, we first propose an indexing scheme that encodes the original table’s tuples into bit vectors (BVs) prior to encryption. The resulting index is then used to narrow the range of encrypted records retrieved from the cloud to a small set of candidates for the user’s query. Based on the indexing scheme, we then design three different models for executing SQL queries over the encrypted data. The data are encrypted with a single randomized encryption algorithm, namely the Advanced Encryption Standard in cipher block chaining mode (AES-CBC). Each proposed scheme uses a different secure method for storing and maintaining the index values (BVs), either at the user’s side or at the cloud server, and each system is extended to support most relational algebra operators, such as select and join. Implementation and evaluation of the proposed systems reveal that they are practical and efficient, reducing both computation and space overhead compared with state-of-the-art systems such as CryptDB.
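    The core idea of the indexing scheme can be illustrated with a small sketch. This is a hypothetical reconstruction, not the paper's actual scheme: attribute domains, bucket boundaries, and table values below are invented, and the AES-CBC encryption of the rows themselves is omitted. Each tuple's attribute value maps to a domain bucket encoded as a one-hot bit vector; a range query becomes a bucket mask, and only tuples whose BV intersects the mask are fetched and decrypted, which may include some false positives.

```python
# Illustrative bit-vector (BV) indexing over bucketized attribute domains.
# Bucket boundaries and table contents are assumptions for this sketch.
BUCKETS = [(0, 25), (25, 50), (50, 75), (75, 101)]

def to_bv(value):
    """Encode a value as a one-hot bit vector over its domain bucket."""
    bv = 0
    for i, (lo, hi) in enumerate(BUCKETS):
        if lo <= value < hi:
            bv |= 1 << i
    return bv

def query_mask(lo, hi):
    """Bit mask of every bucket that overlaps the query range [lo, hi)."""
    mask = 0
    for i, (blo, bhi) in enumerate(BUCKETS):
        if lo < bhi and blo < hi:
            mask |= 1 << i
    return mask

# Index: tuple id -> BV, stored alongside the encrypted rows in the cloud.
table = {1: 10, 2: 30, 3: 60, 4: 90}
index = {tid: to_bv(v) for tid, v in table.items()}

def candidates(lo, hi):
    """Tuple ids whose bucket overlaps the query range (a superset of matches)."""
    m = query_mask(lo, hi)
    return sorted(tid for tid, bv in index.items() if bv & m)
```

    Note that `candidates(20, 40)` returns tuples 1 and 2 even though tuple 1's value (10) lies outside the range: the index only narrows retrieval to a candidate set, and exact filtering happens after decryption at the trusted side.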

    Automatic neonatal sleep stage classification: A comparative study

    Get PDF
    Sleep is an essential feature of living beings. For neonates, it is vital for mental and physical development, and sleep stage cycling is an important parameter for assessing neonatal brain and physical development. It is therefore crucial to monitor newborns' sleep in the neonatal intensive care unit (NICU). Currently, polysomnography (PSG) is the gold-standard method for classifying neonatal sleep patterns, but it is expensive and requires substantial human involvement. Over the last two decades, many researchers have worked on automatic sleep stage classification algorithms using electroencephalography (EEG), electrocardiography (ECG), and video. In this study, we present a comprehensive review of existing algorithms for neonatal sleep, their limitations, and future recommendations. Additionally, we report a brief comparison of the extracted features, classification algorithms, and evaluation parameters.

    An effective approach for plant leaf diseases classification based on a novel DeepPlantNet deep learning model

    Get PDF
    Introduction: Plant disease detection and diagnosis have recently become a primary agricultural concern. Early detection of plant diseases enables farmers to take preventative action, stopping a disease's transmission to other plant sections. Plant diseases are a severe hazard to food safety, yet rapid diagnosis remains difficult in many parts of the world where the essential infrastructure is missing. Depending on the severity of the infection, a plant may suffer anything from minor damage to total devastation, so early detection is necessary to protect output. Physical examination for plant diseases yields low accuracy, requires considerable time, and cannot reliably anticipate disease; creating an automated method capable of accurate classification is therefore vital.
    Method: This research proposes an efficient, novel, and lightweight deep learning (DL) architecture, DeepPlantNet, for predicting and categorizing plant leaf diseases. The proposed DeepPlantNet model comprises 28 learned layers: 25 convolutional (Conv) layers and three fully connected (FC) layers. The framework employs Leaky ReLU (LReLU), batch normalization (BN), fire modules, and a mix of 3×3 and 1×1 filters, making it a novel plant disease classification framework. The proposed DeepPlantNet model can categorize plant disease images into many classifications.
    Results: The proposed approach categorizes plant diseases into the following ten groups: Apple_Black_rot (ABR), Cherry_(including_sour)_Powdery_mildew (CPM), Grape_Leaf_blight_(Isariopsis_Leaf_Spot) (GLB), Peach_Bacterial_spot (PBS), Pepper_bell_Bacterial_spot (PBBS), Potato_Early_blight (PEB), Squash_Powdery_mildew (SPM), Strawberry_Leaf_scorch (SLS), bacterial tomato spot (TBS), and maize common rust (MCR). The proposed framework achieved average accuracies of 98.49% and 99.85% for the eight-class and three-class classification schemes, respectively.
    Discussion: The experimental findings demonstrated the DeepPlantNet model's superiority over the alternatives. The proposed technique can reduce financial and agricultural output losses by quickly and effectively assisting professionals and farmers in identifying plant leaf diseases.
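    The fire modules and the mix of 3×3 and 1×1 filters are what keep such an architecture lightweight. A quick parameter-count sketch shows why, assuming a SqueezeNet-style fire module (a 1×1 squeeze layer feeding parallel 1×1 and 3×3 expand branches); the channel sizes below are illustrative assumptions, not DeepPlantNet's actual layer widths, which the abstract does not give.

```python
# Parameter-count arithmetic for a fire module vs. one plain 3x3 convolution.
# Channel widths (128 in, 256 out, squeeze 16) are assumed for illustration.

def conv_params(in_ch, out_ch, k):
    """Weight count of a k x k convolution (bias terms omitted)."""
    return in_ch * out_ch * k * k

def fire_params(in_ch, squeeze, expand):
    """1x1 squeeze layer, then parallel 1x1 and 3x3 expand branches."""
    s = conv_params(in_ch, squeeze, 1)
    e1 = conv_params(squeeze, expand, 1)
    e3 = conv_params(squeeze, expand, 3)
    return s + e1 + e3

plain = conv_params(128, 256, 3)   # one plain 3x3 conv: 294,912 weights
fire = fire_params(128, 16, 128)   # fire module, same 256 output channels: 22,528
```

    Routing the input through a narrow 1×1 squeeze layer before the expensive 3×3 filters cuts the weight count by more than an order of magnitude in this example, which is the design choice that makes fire-module networks suitable as lightweight classifiers.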

    An Intelligent Traffic Surveillance System Using Integrated Wireless Sensor Network and Improved Phase Timing Optimization

    No full text
    The transportation industry is crucial to the realization of a smart city. However, the current growth in vehicle numbers is not being matched by an increase in road capacity. Congestion can increase the number of accidents, harm economic growth, and result in higher gas emissions, and traffic congestion is now seen as a severe threat to urban life. The suffering caused by increased car traffic, insufficient infrastructure, and inefficient traffic management has exceeded tolerable limits. Since route decisions are typically made in a short amount of time, the data must be visualized in an easily comprehensible way. Moreover, the data generated by the transportation system are difficult to process and sometimes lack effective usage in certain fields. Hence, to overcome these challenges, a novel computer vision-based traffic management system is proposed that integrates a wireless sensor network (WSN) with a visual analytics framework. This research analyzes average message delivery, average latency, average access, average energy consumption, and network performance. Wireless sensors are used to collect road metrics, quantify them, and then rank them for entry. Improved phase timing optimization (IPTO) is used to optimize the traffic data. The whole experiment was carried out in a virtual environment, and the results show that the proposed approach outperforms existing approaches.
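    The abstract names improved phase timing optimization (IPTO) but gives no algorithmic detail. As a generic illustration only (not the paper's IPTO), a baseline form of phase timing optimization splits a fixed signal cycle across approaches in proportion to sensed queue lengths; the cycle length and queue values below are invented.

```python
# Generic proportional phase-timing sketch: green time per approach is
# proportional to that approach's sensed queue length. Not the paper's IPTO.

def phase_times(queues, cycle=60):
    """Split a fixed cycle (seconds) across phases proportionally to queue length."""
    total = sum(queues)
    return [cycle * q / total for q in queues]

greens = phase_times([30, 10, 20])   # three approaches sharing a 60 s cycle
```

    An "improved" scheme would refine this baseline, for example by adding minimum green times or weighting by sensed arrival rates, which is where WSN road metrics feed into the optimization.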

    A Novel Blockchain-Based Encryption Model to Protect Fog Nodes from Behaviors of Malicious Nodes

    No full text
    The world has experienced a huge advancement in computing technology. People prefer outsourcing their confidential data for storage and processing to cloud computing because of the auspicious services provided by cloud service providers. As promising as this paradigm is, it creates issues ranging from data security to the latency of data computation and delivery to end-users. In response to these challenges, the fog computing paradigm was proposed as an extension of cloud computing that overcomes latency and communication overhead by bringing computing and storage resources close to the ground and to end-users. However, fog computing inherits the same security and privacy challenges encountered by traditional cloud computing. This paper proposes a fine-grained data access control approach that integrates the ciphertext-policy attribute-based encryption (CP-ABE) algorithm with blockchain technology to protect end-users’ data against rogue fog nodes, so that a compromised fog node can be ousted. In this approach, we propose federations of fog nodes that share the same attributes, such as services and locations. The fog federation concept minimizes latency and communication overhead between fog nodes and cloud servers. Furthermore, integrating the blockchain with the CP-ABE algorithm allows fog nodes within the same federation to conduct a distributed authorization process. In addition, to address latency and communication overhead, we equip each fog node with an off-chain database that stores the most frequently accessed data files for a particular time, as well as an on-chain access control policy table (an on-chain file-tracking table) that must be protected from tampering by rogue fog nodes. The blockchain plays a critical role here because it is tamper-proof by nature. We assess the approach’s efficiency and feasibility by conducting a simulation and analyzing its security and performance.
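    The split between a mutable off-chain cache and a tamper-proof on-chain record can be sketched in a few lines. This is a minimal illustration of the general pattern, not the paper's implementation: the file name and contents are invented, plain dictionaries stand in for the fog node's database and the blockchain ledger, and CP-ABE encryption is omitted.

```python
# Off-chain cache with on-chain integrity tracking: the fog node stores the
# file bytes off-chain, while an immutable on-chain table holds their digests.
# Any rogue modification of the off-chain copy is caught by re-hashing.
import hashlib

off_chain = {"report.pdf": b"ciphertext-bytes"}            # fog node's local cache
on_chain = {name: hashlib.sha256(data).hexdigest()         # tamper-proof record
            for name, data in off_chain.items()}

def verify(name):
    """True iff the cached file still matches its on-chain digest."""
    digest = hashlib.sha256(off_chain[name]).hexdigest()
    return digest == on_chain[name]
```

    Serving hot files from the off-chain cache avoids round trips to the cloud, while the on-chain digests mean a rogue fog node cannot silently alter a cached file without the federation detecting the mismatch.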

    Transfer Learning-Based Automatic Hurricane Damage Detection Using Satellite Images

    No full text
    After a hurricane, damage assessment is extremely important for emergency managers so that relief aid can be provided to afflicted people. One method of assessing the damage is to distinguish damaged from undamaged buildings post-hurricane. Normally, damage assessment is performed through ground surveys, which are time-consuming and involve immense effort. In this paper, transfer learning techniques are used to identify damaged and undamaged buildings in post-hurricane satellite images. Four transfer learning models, VGG16, MobileNetV2, InceptionV3, and DenseNet121, were applied to 23,000 satellite images of Hurricane Harvey, which struck the Texas region. A comparative analysis of these models was performed on the basis of the number of epochs and the optimizers used. The pre-trained VGG16 model performed better than the others, achieving an accuracy of 0.75, precision of 0.74, recall of 0.95, and F1-score of 0.83 when the Adam optimizer was used. When the best-performing models were compared across optimizers, VGG16 produced the best accuracy of 0.78 with the RMSprop optimizer.
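    The reported metrics are internally consistent: the F1-score is the harmonic mean of precision and recall, so the 0.83 figure follows directly from the reported 0.74 and 0.95. A quick check:

```python
# F1 as the harmonic mean of precision and recall, applied to the
# abstract's reported VGG16 + Adam figures.

def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

vgg16_f1 = f1_score(0.74, 0.95)   # ~0.832, rounding to the reported 0.83
```

    The gap between recall (0.95) and precision (0.74) suggests the model flags most damaged buildings but also mislabels a fair number of undamaged ones, a common trade-off when missing damage is costlier than a false alarm.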

    A Novel CovidDetNet Deep Learning Model for Effective COVID-19 Infection Detection Using Chest Radiograph Images

    No full text
    Suspected cases of COVID-19 must be detected quickly and accurately to avoid large-scale transmission. Existing COVID-19 diagnostic tests are slow and take several hours to generate results, whereas most X-rays or chest radiographs take less than 15 min to complete. Chest radiographs can therefore support early and accurate COVID-19 detection and diagnosis, reducing treatment problems and saving time. For this purpose, we propose CovidDetNet, which comprises ten learnable layers: nine convolutional layers and one fully connected layer. The architecture uses two activation functions, ReLU and Leaky ReLU, and two normalization operations, batch normalization and cross-channel normalization, making it a novel COVID-19 detection model. It is a deep learning-based approach that automatically and reliably detects COVID-19 from chest radiograph images. Toward this end, a fine-grained classification experiment was conducted to classify chest radiograph images as normal, COVID-19-positive, or pneumonia. The performance of the proposed CovidDetNet model was evaluated on a standard COVID-19 Radiography Database. Moreover, we compared our approach with hybrid approaches that use deep learning models as feature extractors and support vector machines (SVMs) as classifiers. Experimental results showed the superiority of the proposed CovidDetNet model over existing methods, outperforming the baseline hybrid deep learning-based models with a high accuracy of 98.40%.
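    The two activation functions the architecture mixes differ only in how they treat negative inputs. A pure-Python sketch of both (the Leaky ReLU slope `alpha` below is an assumed common default; the abstract does not state the paper's value):

```python
# ReLU zeroes negative activations; Leaky ReLU instead passes a small
# fraction of them through, which avoids "dead" units with zero gradient.

def relu(x):
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Like ReLU, but with a small slope (alpha, assumed) for x < 0."""
    return x if x >= 0 else alpha * x
```

    Mixing the two lets later layers keep some signal from negative pre-activations while earlier layers retain ReLU's sparsity, a design choice several lightweight classification networks share.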